🎉 One Paper Has Been Accepted by CVPR 2023
CVPR 2023 officially released the list of accepted papers. We are thrilled to announce that Dr. Li's paper from our team has been included!
📄 Boosting Low-Data Instance Segmentation by Unsupervised Pre-training with Saliency Prompt
Authors: Hao Li (李昊), Dingwen Zhang (张鼎文), Junwei Han (韩军伟)
Conference: CVPR 2023
Research Background
Recently, query-based end-to-end instance segmentation (QEIS) methods inspired by DETR variants have outperformed CNN-based models on large-scale datasets.
Yet they lose efficacy when only a small amount of training data is available, since the crucial queries/kernels struggle to learn localization and shape priors.
Key Contributions
To this end, this work offers a novel unsupervised pre-training solution for low-data regimes. Inspired by the recent success of prompting techniques, we introduce a new pre-training method that boosts QEIS models by supplying saliency prompts for queries/kernels.
Our method contains three parts:
1. Saliency Masks Proposal
This component is responsible for generating pseudo masks from unlabeled images based on the saliency mechanism. By leveraging visual saliency, we can automatically identify and segment prominent objects without manual annotation.
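To make the idea concrete, here is a minimal sketch of how saliency-based pseudo masks could be extracted: threshold a saliency map and split it into connected components, one pseudo mask per salient object. The function name, the threshold, and the minimum-area filter are illustrative assumptions, not the paper's actual pipeline or hyperparameters.

```python
def saliency_pseudo_masks(saliency, threshold=0.5, min_area=4):
    """Threshold a saliency map (2-D list of floats in [0, 1]) and split it
    into per-object binary pseudo masks via 4-connected components.
    Hyperparameters here are illustrative, not the paper's values."""
    h, w = len(saliency), len(saliency[0])
    binary = [[saliency[y][x] >= threshold for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    masks = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy][sx] and not seen[sy][sx]:
                # Flood-fill one connected component of salient pixels.
                stack, comp = [(sy, sx)], []
                seen[sy][sx] = True
                while stack:
                    y, x = stack.pop()
                    comp.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(comp) >= min_area:  # drop tiny noisy blobs
                    masks.append(set(comp))
    return masks

# Toy saliency map with two separated salient blobs
sal = [[0.0] * 8 for _ in range(8)]
for y in range(1, 4):
    for x in range(1, 4):
        sal[y][x] = 0.9   # first salient object
for y in range(5, 8):
    for x in range(5, 8):
        sal[y][x] = 0.8   # second salient object
masks = saliency_pseudo_masks(sal)
print(len(masks))  # → 2
```

A real system would of course operate on feature-level saliency rather than a toy map, but the output, a set of instance-like binary masks with no manual labels, is the ingredient the next two components consume.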
2. Prompt-Kernel Matching
This part transforms pseudo masks into prompts and injects the corresponding localization and shape priors into the best-matched kernels, allowing the model to learn effectively from unlabeled data.
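The matching step can be sketched as a bipartite assignment between prompt embeddings and kernel embeddings. Below, a greedy best-first assignment stands in for the optimal matching a real implementation would likely use, and "injecting the prior" is modeled as a simple additive update; all shapes, names, and the 0.5 blend factor are illustrative assumptions, not the paper's method.

```python
def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def match_prompts_to_kernels(prompts, kernels):
    """Greedily assign each saliency prompt to its most similar unused
    kernel (a stand-in for optimal bipartite matching), then inject the
    prompt's localization/shape prior into the matched kernel by a simple
    additive nudge. Returns the list of (prompt_idx, kernel_idx) pairs."""
    scored = sorted(
        ((cosine(p, k), pi, ki)
         for pi, p in enumerate(prompts)
         for ki, k in enumerate(kernels)),
        reverse=True)  # consider pairs from most to least similar
    matched_prompts, used_kernels, pairs = set(), set(), []
    for _, pi, ki in scored:
        if pi in matched_prompts or ki in used_kernels:
            continue
        matched_prompts.add(pi)
        used_kernels.add(ki)
        pairs.append((pi, ki))
        # Inject the prior: move the kernel toward its matched prompt.
        kernels[ki] = [k + 0.5 * p for k, p in zip(kernels[ki], prompts[pi])]
    return pairs

prompts = [[1.0, 0.0], [0.0, 1.0]]               # pooled pseudo-mask features (toy)
kernels = [[0.9, 0.1], [0.1, 0.9], [0.5, 0.5]]   # QEIS kernels (toy)
pairs = match_prompts_to_kernels(prompts, kernels)
print(sorted(pairs))  # → [(0, 0), (1, 1)]
```

The design point is that each prompt finds the kernel already closest to it, so the injected prior reinforces rather than overwrites what the kernel is learning.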
3. Kernel Supervision
Kernel supervision supplies a training signal at the kernel level for robust learning, ensuring that each kernel learns meaningful and discriminative features.
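One plausible form such kernel-level supervision could take is a per-kernel mask loss against the matched pseudo mask, averaged over all matched pairs so every kernel receives a direct gradient. The soft Dice loss below is an illustrative choice; the paper's actual objective may differ.

```python
def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss between a predicted mask and a pseudo mask, both
    flattened to lists of values in [0, 1]. Illustrative choice of loss."""
    inter = sum(p * t for p, t in zip(pred, target))
    denom = sum(pred) + sum(target)
    return 1.0 - (2.0 * inter + eps) / (denom + eps)

def kernel_supervision_loss(kernel_preds, pseudo_masks, matches):
    """Average per-kernel mask loss over matched (prompt_idx, kernel_idx)
    pairs, so each kernel gets its own supervision signal."""
    losses = [dice_loss(kernel_preds[ki], pseudo_masks[pi])
              for pi, ki in matches]
    return sum(losses) / len(losses)

kernel_preds = [[1.0, 1.0, 0.0, 0.0]]   # toy per-kernel mask predictions
pseudo_masks = [[1.0, 1.0, 0.0, 0.0]]   # matched saliency pseudo masks
matches = [(0, 0)]                       # (prompt_idx, kernel_idx) pairs
loss = kernel_supervision_loss(kernel_preds, pseudo_masks, matches)
print(round(loss, 6))  # → 0.0
```

Supervising at the kernel level, rather than only on the final aggregated masks, is what prevents some kernels from collapsing into uninformative duplicates during pre-training.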
Practical Impact
From a practical perspective, our pre-training method helps QEIS models achieve convergence speed and performance comparable to CNN-based models in low-data regimes.
This is a significant breakthrough as it enables state-of-the-art instance segmentation models to work effectively even when training data is scarce, which is a common scenario in many real-world applications.
Experimental Results
Experimental results show that our method significantly boosts several QEIS models on three datasets. The comprehensive evaluation demonstrates the effectiveness and generalizability of our proposed approach across different architectures and datasets.
The results validate that saliency-based unsupervised pre-training is a promising direction for addressing the low-data challenge in instance segmentation tasks.
Conclusion
This acceptance at CVPR 2023 represents a significant contribution to the instance segmentation field. By introducing saliency prompts for unsupervised pre-training, our work bridges the gap between query-based and CNN-based methods in low-data scenarios.
Congratulations to Dr. Hao Li (李昊) and all co-authors for this outstanding achievement! 🎊